Systems and methods for determining the fullness of a commercial trailer
Abstract:
A three-dimensional (3D) depth imaging system is provided for use in commercial trailer loading applications. A 3D depth camera can be mounted and oriented to capture 3D image data from a vehicle storage area. A depth detection application running on one or more processors can determine at least a wall data area and a non-wall data area based on the 3D image data. Based on the determination of the wall data area and the non-wall data area, the depth detection application generates a wall indicator that indicates that a wall is located at a particular depth within the vehicle storage area.

Publication number: BE1025931B1
Application number: E20185904
Filing date: 2018-12-18
Publication date: 2020-04-01
Inventors: Miroslav Trajkovic; Justin F. Barish; Adithya H. Krishnamurthy
Applicant: Symbol Technologies LLC
Description:
Systems and methods for determining the fullness of a commercial trailer

FIELD OF THE INVENTION

The invention generally relates to commercial vehicles and, more specifically, to systems and methods for determining the fullness of commercial vehicles.

BACKGROUND

Three-dimensional (3D) cameras can be used to determine distances from the camera to objects positioned in its vicinity. Such 3D cameras can be used to determine the fullness of a commercial trailer, since even a small increase in fullness can lead to significant savings in transport costs. Such fullness determinations can be troublesome due to object interference or disturbances that can adversely affect the calculations. For example, in situations that require sensing surface areas within a large storage area, such as a commercial trailer storage area, the storage area may contain moving objects, such as people, packages being moved, loading vehicles, and the like, that may cause the 3D camera to produce inaccurate data regarding the size or dimensions of the storage space. Such inaccurate data is particularly problematic for end-use applications that rely on the data to perform calculations or other data manipulations to facilitate depth or other 3D determinations, and can cause inaccurate fullness measurements that adversely affect loading efficiency. In addition, object interference can increase the system processing time required to determine the fullness of the trailer, which in turn can further increase loading times and associated costs. Accordingly, there is a need for depth imaging systems and methods for use in loading a commercial trailer.

SUMMARY

In accordance with an aspect of the invention, a three-dimensional (3D) depth imaging system is provided for use in commercial trailer loading applications. The 3D depth imaging system comprises a 3D depth camera arranged to record 3D image data, the 3D depth camera being oriented in a direction to record 3D image data from a vehicle storage space, and a depth detection application running on one or more processors. The depth detection application determines, based on the 3D image data, at least a wall data area and a non-wall data area, and the determination of the wall data area and the non-wall data area prompts the depth detection application to generate a wall indicator, the wall indicator indicating that a wall is located at a particular depth within the vehicle storage space.

The 3D depth camera and the one or more processors can be housed in an attachable device. The wall may preferably be one of a package wall or a wall of the vehicle storage space. The non-wall data area can be associated with one of a loader or a vehicle storage staging area. For example, the non-wall data area can be determined by comparing historical 3D image data to identify at least one peak area.

The depth detection application can preferably be configured to ignore non-wall data areas when generating the wall indicator. The 3D depth imaging system may preferably further include a dashboard application, the dashboard application being executed on a client device, and the determination of the wall data area and the non-wall data area may further prompt the dashboard application to receive the wall indicator. The dashboard application may preferably include a vehicle storage capacity value, the vehicle storage capacity value indicating a filled capacity of the vehicle storage space.
Alternatively or additionally, the dashboard application may include a vehicle storage capacity value indicating a remaining capacity of the vehicle storage space. The 3D image data can be, for example, 3D point cloud data. The 3D image data can preferably be recorded periodically, for example approximately every 15 seconds, approximately every 30 seconds, approximately every minute, or approximately every two minutes.

According to a further aspect of the invention, a 3D depth imaging method is provided for use in loading a commercial trailer. The 3D depth imaging method comprises recording, via a 3D depth camera, 3D image data from a vehicle storage space, the 3D depth camera being oriented in a direction to record the 3D image data from the vehicle storage space. The method further comprises analyzing, via one or more processors, the 3D image data with a depth detection application. The depth detection application determines, based on the 3D image data, at least a wall data area and a non-wall data area. The method generates, via the depth detection application, a wall indicator that indicates a wall located at a particular depth within the vehicle storage space.

Analyzing the 3D image data may include calculating a histogram of the depths in the 3D image data, identifying a plurality of histogram peaks, identifying a highest peak among the plurality of histogram peaks and filtering out the histogram peaks whose height falls outside a threshold of the highest peak, and identifying the peak among the plurality of histogram peaks that is farthest from the 3D depth camera and filtering out the remaining histogram peaks located more than a specified distance from that farthest peak. The 3D depth imaging method may further determine whether the peak that is farthest from the 3D depth camera is the current wall. Analyzing the 3D image data may further include identifying a highest remaining peak from the remaining histogram peaks and generating a wall indicator. The 3D depth imaging method may also include calculating an average wall placed between the wall indicator and a total depth of the vehicle storage space.

The non-wall data area can be associated with a loader or a vehicle storage staging area. For example, the non-wall data area can be determined by comparing historical 3D image data to identify at least one peak area. The depth detection application can be configured to ignore non-wall data areas when generating the wall indicator. Preferably, the 3D depth imaging method may further include receiving the wall indicator via a dashboard application running on a client device. The dashboard application may include a vehicle storage capacity value indicating a filled capacity of the vehicle storage space. The 3D image data can be recorded, for example, approximately every 15 seconds, approximately every 30 seconds, approximately every minute, or approximately every two minutes.
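The analysis steps summarized above lend themselves to a compact illustration. The following is a minimal, hypothetical Python sketch rather than the patented implementation: the bin width, the one-sixth height threshold, and the 50-inch (about 1.27 m) distance cutoff are illustrative values taken from the example figures in the detailed description below, and all function and variable names are assumptions.

```python
import numpy as np

def depth_histogram(points, bin_width=0.1):
    """Histogram of point depths (z, in meters) computed from 3D point-cloud
    data, where points is an (N, 3) array of x, y, z coordinates."""
    depths = points[:, 2]
    bins = np.arange(0.0, depths.max() + bin_width, bin_width)
    return np.histogram(depths, bins=bins)

def candidate_wall_peaks(counts, edges, height_fraction=1 / 6, max_gap=1.27):
    """Keep histogram peaks that (a) reach at least height_fraction of the
    tallest peak and (b) lie within max_gap meters of the farthest peak."""
    # Local maxima: bins at least as tall as the left neighbor and taller
    # than the right neighbor.
    peaks = [i for i in range(1, len(counts) - 1)
             if counts[i] >= counts[i - 1] and counts[i] > counts[i + 1]]
    if not peaks:
        return []
    tallest = max(counts[i] for i in peaks)
    peaks = [i for i in peaks if counts[i] >= height_fraction * tallest]
    farthest = max(peaks, key=lambda i: edges[i])  # deepest surviving peak
    return [i for i in peaks if edges[farthest] - edges[i] <= max_gap]
```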
BRIEF DESCRIPTION OF THE FIGURES

The above needs are at least partially met by providing the systems and methods for determining the fullness of a commercial trailer described in the following figure description, in particular in combination with the drawings, in which:

FIG. 1 shows a perspective view, seen from above, of a loading dock comprising a loading facility, a plurality of loading bays, a plurality of vehicles, and a plurality of vehicle storage spaces, in accordance with exemplary embodiments herein;

FIG. 2A shows a perspective view of the loading facility of FIG. 1 depicting a vehicle storage space associated with a loading bay, in accordance with exemplary embodiments herein;

FIG. 2B shows a perspective view of a trailer monitoring unit (TMU) of FIG. 2A, in accordance with exemplary embodiments herein;

FIG. 3 shows a block diagram representing an embodiment of a server associated with the loading facility of FIG. 2A and the TMU of FIG. 2B;

FIG. 4A shows a photo-realistic view showing a first embodiment of the vehicle storage space associated with the loading bay of FIG. 2A, wherein the storage space comprises a package wall;

FIG. 4B shows a histogram representation of the photo-realistic view of FIG. 4A, which includes a detected package wall;

FIG. 4C shows a histogram representation of the photo-realistic view of FIG. 4A that is scaled or weighted to correct for objects positioned near the TMU;

FIGS. 5A-5C show historical photo-realistic views showing a second embodiment of the vehicle storage space associated with the loading bay of FIG. 2A, wherein the storage space comprises a package wall and a loader positioned at various locations within the vehicle storage space;

FIGS. 5D-5F show histogram views of the historical photo-realistic views of FIGS. 5A-5C, respectively;

FIG. 5G is a modified histogram representation of FIGS. 5D-5F in which the data representing the loader and packages at temporary locations are minimized;

FIG. 6 shows a flowchart of a depth imaging method for use in loading a commercial trailer; and

FIGS. 7A-7C show histogram views showing steps for determining a dominant package wall.

It is noted that elements in the figures are illustrated for simplicity and clarity and are not necessarily drawn to scale. For example, the dimensions and/or relative positioning of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various embodiments of the invention. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment have often not been shown in order to provide a less obstructed view of these various embodiments. Certain actions and/or steps may be described or shown in a particular order of occurrence, while it will be apparent that such specificity with respect to sequence is not actually required. The terms and expressions used herein have the ordinary technical meaning conventionally accorded to such terms and expressions, except where different specific meanings are otherwise set forth herein.

DETAILED DESCRIPTION

Described herein are systems and methods that enable accurate determination of the fill percentage of commercial trailers. These systems and methods use 3D cameras to determine a "fill distance" or a "wall" of packages closest to the trailer entrance. By calculating a histogram of depth measurements for the trailer and making the necessary corrections, the described systems and methods can accurately account for interference while being able to quickly calculate current trailer fill rates.
The techniques described enable more accurate reporting and display of vehicle capacity and fullness data, thereby reducing false reports or other inaccurate data and reporting, for example, in graphical representations of a vehicle storage space, as implemented by the loading applications described herein.

Accordingly, in various embodiments described herein, three-dimensional (3D) depth imaging systems for use in loading a commercial trailer are described. For example, a 3D depth camera can be mounted and oriented to record 3D image data from a vehicle storage area. A depth detection application running on one or more processors can determine at least a wall data area and a non-wall data area based on the 3D image data. Based on the determination of the wall data area and the non-wall data area, the depth detection application generates a wall indicator indicating a wall situated at a particular depth within the vehicle storage area.

FIG. 1 is a perspective view from above of a loading dock 100 comprising a loading facility 101, a plurality of loading bays 102d-110d, a plurality of vehicles 106v and 110v, and a plurality of vehicle storage spaces 102s-110s, in accordance with exemplary embodiments herein. For example, in some embodiments, loading dock 100 may be associated with a retail store, wholesale store, or other such commercial building. In other embodiments, loading dock 100 may be associated with a storage facility or a waypoint facility for housing packages, boxes, or other transportable objects or goods typically involved in the distribution and logistics of such transportable objects and goods. Additional embodiments are contemplated herein such that loading dock 100 accommodates the loading and unloading of transportable objects or goods at a store, facility, or other such location.

FIG. 1 shows, for example, loading facility 101, which, as described, may be a retail store, a storage facility, or another such location that has a place for loading and unloading transportable objects and goods. Loading facility 101 includes a plurality of loading bays 102d-110d. For example, loading bay 104d is shown as undocked, and includes an opening of a size equal or similar to that of an opening of a vehicle storage space. As shown in FIG. 1, loading bay 104d may further include padding or insulation to seat the trailer (e.g., a vehicle storage space) against the wall of the loading facility 101. Loading bay 104d may further include a retractable door positioned within the opening of loading bay 104d, where the door may be opened to provide access to the vehicle storage space of a trailer from the loading facility 101. As described herein, loading bay 104d is representative of the remaining loading bays shown, such as loading bays 102d, 106d, 108d, and 110d, where loading bays 102d, 106d, 108d, and 110d may have similar features or functionality as described herein for loading bay 104d.

In various embodiments, an opening of a vehicle storage space may be the opening of a trailer, where the trailer may be towed by a semi, tractor-trailer, truck, or other such vehicle capable of coupling with and moving a trailer (e.g., a vehicle storage space), as described herein. In some embodiments, the floor of a trailer, when docked, may be flush, or approximately flush, with the floor of a loading bay (e.g., loading bays 102d-110d) of loading facility 101. FIG. 1 also shows a plurality of vehicle storage spaces 102s, 106s, and 110s.
Vehicle storage spaces 102s, 106s, and 110s may each be storage spaces associated with a vehicle, for example a trailer or other transportable storage space (e.g., 102s, 106s, and 110s) associated with a semi, tractor-trailer, truck, or other such large vehicle (e.g., 106v and 110v) as described herein. As shown, for example, in FIG. 1, each of the vehicles 106v and 110v is associated with vehicle storage spaces 106s and 110s, respectively. Each of the vehicles 106v and 110v may be responsible for maneuvering its respective vehicle storage space 106s or 110s to a respective loading bay, such as loading bays 106d and 110d.

As described herein, each of the vehicle storage spaces 102s, 106s, and 110s includes openings, generally at one end, that are of the same or similar size as the openings of the loading bays 102d-110d. In this manner, the vehicle storage spaces 102s, 106s, and 110s may cooperate with, or be coupled to, the loading bays 102d-110d to accommodate the loading and unloading of packages, boxes, or other transportable objects or goods as described herein. As shown, for example, in FIG. 1, vehicle storage space 102s is shown as a trailer coupled to loading bay 102d. Accordingly, the opening of vehicle storage space 102s cooperates with the opening of loading bay 102d such that the interior of vehicle storage space 102s can be seen or accessed from loading bay 102d. Likewise, vehicle storage space 110s is also shown as a trailer coupled to loading bay 110d, the opening of vehicle storage space 110s cooperating with the opening of loading bay 110d such that the interior of vehicle storage space 110s can be seen or accessed from loading bay 110d. Vehicle storage space 106s is shown as not currently coupled to loading bay 106d.

Vehicle storage spaces, such as 102s, 106s, and 110s, can have different sizes, lengths, or other dimensions. For example, in one embodiment, vehicle storage space 102s can be associated with a 63-foot-long trailer, vehicle storage space 106s with a 53-foot-long trailer, and vehicle storage space 110s with a 73-foot-long trailer. Other variations of vehicle storage space dimensions, sizes, and/or lengths are contemplated herein.

FIG. 2A is a perspective view 200 of the loading facility 101 of FIG. 1 showing vehicle storage space 102s associated with loading bay 102d, in accordance with exemplary embodiments herein. FIG. 2A shows, for example, vehicle storage space 102s, which in the embodiment of FIG. 2A is an interior view of the vehicle storage space 102s of FIG. 1. FIG. 2A also shows loading bay 102d, which in the embodiment of FIG. 2A is an interior view of loading bay 102d of FIG. 1. As shown in FIG. 2A, vehicle storage space 102s is coupled to loading bay 102d, the inside of vehicle storage space 102s being accessible from the inside of loading facility 101. Vehicle storage space 102s includes packages, boxes, and/or other transportable objects or goods, including packages 208p1-208p3, which, in some embodiments, may correspond to package walls, as described herein. The packages 208p1-208p3 may be in a state of being loaded into or unloaded from the vehicle storage space 102s. For example, worker 212 may be in a state of loading or unloading additional packages 210 into or out of vehicle storage space 102s.
In some embodiments, manager 206 may oversee, supervise, or otherwise facilitate the loading and unloading of packages, boxes, and/or other transportable objects or goods (e.g., packages 208p1-208p3 or 210) into or out of the vehicle storage space 102s. For example, manager 206 can use a dashboard app running on a client device 204, as described herein.

FIG. 2A also shows a trailer monitoring unit (TMU) 202. TMU 202 may be an attachable device that includes a 3D depth camera for capturing 3D images (e.g., 3D image data) and a photo-realistic camera for capturing 2D images (e.g., 2D image data). The photo-realistic camera can be an RGB (red, green, blue) camera for capturing 2D images. The TMU 202 may also include one or more processors and one or more computer memories for storing image data and/or for running apps that perform analysis or other functions as described herein. In various embodiments, and as shown in FIG. 2A, the TMU 202 can be mounted within the loading facility 101 and oriented toward vehicle storage space 102s to capture 3D and/or 2D image data of the inside of vehicle storage space 102s. As shown, for example, in FIG. 2A, TMU 202 can be oriented such that the 3D and 2D cameras of TMU 202 look down the length of the vehicle storage space 102s so that TMU 202 can scan or record the walls, floor, ceiling, packages (e.g., 208p1-208p3 or 210), or other objects or surfaces within vehicle storage space 102s to determine the 3D and 2D image data. The image data can be processed by the one or more processors and/or memories of the TMU 202 (or, in some embodiments, one or more remote processors and/or memories of a server) to implement analysis functions, such as graphical or image analysis, as described in one or more of the flowcharts, block diagrams, methods, functions, or various embodiments herein.

For example, in some embodiments, the TMU 202 can process the 3D and 2D image data, as scanned or recorded by the 3D depth camera and the photo-realistic camera, for use by other devices (e.g., client device 204 or server 301, as described further herein). For example, the one or more processors and/or one or more memories of the TMU 202 can process the image data scanned or recorded from vehicle storage space 102s. The processing of the image data can generate post-scan data, which may include metadata, simplified data, normalized data, result data, status data, or alert data as determined from the original scanned or recorded image data. In some embodiments, the image data and/or the post-scan data can be sent to a client application, such as a dashboard application (app) described herein, for viewing, manipulation, or other interaction. In other embodiments, the image data and/or the post-scan data can be sent to a server (e.g., server 301 as further described herein) for storage or for further manipulation.

As shown in FIG. 2A, the image data and/or the post-scan data can be received on client device 204. Client device 204 can implement a dashboard app for receiving the image data and/or the post-scan data and displaying such data, for example in graphical or other format, to manager 206 to facilitate the unloading or loading of packages (e.g., 208p1-208p3 or 210), as described herein. In some embodiments, the dashboard app can be implemented via a web platform, such as Java J2EE (e.g., JavaServer Faces) or Ruby on Rails. In such embodiments, the web platform may generate or update a dashboard app user interface through the generation of a dynamic web page (e.g.,
using HTML, CSS, and JavaScript) or through a client-facing mobile app (e.g., via Java for a Google Android based app or Objective-C/Swift for an Apple iOS based app), where the user interface is displayed via the dashboard app on the client device, e.g., client device 204.

In some embodiments, the dashboard app can receive the image data and/or the post-scan data and display such data directly. Client device 204 may be a mobile device, such as a tablet, smartphone, laptop, or other such mobile computing device. Client device 204 may implement an operating system or platform for running the dashboard (or other) apps or functionality, including, for example, the Apple iOS platform, the Google Android platform, and/or the Microsoft Windows platform. Client device 204 may include one or more processors and/or one or more memories that implement the dashboard app or provide other similar functionality. Client device 204 may also include wired or wireless transceivers for receiving image data and/or post-scan data as described herein. Such wired or wireless transceivers may implement one or more communication protocol standards including, for example, TCP/IP, WiFi (802.11b), Bluetooth, or any other comparable communication protocol or standard.

In some embodiments, the image data and/or the post-scan data can be sent to a server or other centralized entity, such as server 301 described herein. In such embodiments, the server or centralized entity may generate post-scan data that may include metadata, simplified data, normalized data, result data, status data, or alert data, as determined from the original scanned or recorded image data provided by the TMU 202. As described herein, the server or centralized entity can store such data and can also send the image data and/or the post-scan data to a dashboard app, or other app, implemented on a client device, such as the dashboard app implemented on client device 204 of FIG. 2A.

FIG. 2B is a perspective view of the TMU 202 of FIG. 2A, in accordance with exemplary embodiments herein. In the exemplary embodiment of FIG. 2B, TMU 202 may include a mounting bracket 252 for orienting or otherwise positioning the TMU 202 within the loading facility 101 as described herein. The TMU 202 may further include one or more processors and one or more memories for processing image data as described herein. For example, the TMU 202 may include flash memory used to determine, store, or otherwise process the image data and/or post-scan data. In addition, TMU 202 may further include a network interface to enable communication with other devices (such as server 301 of FIG. 3 as described herein). The network interface of TMU 202 may include any suitable type of communication interface(s) (e.g., wired and/or wireless interfaces) configured to operate in accordance with any suitable protocol, e.g., Ethernet for wired connections and/or IEEE 802.11 for wireless connections.

TMU 202 may include a 3D depth camera 254 for capturing, recording, or scanning 3D image data. For example, in some embodiments, the 3D depth camera 254 may include an infrared (IR) projector and a related IR camera. In such embodiments, the IR projector projects a pattern of IR light or beams onto an object or surface, which, in various embodiments herein, may include surfaces of a vehicle storage space (e.g., vehicle storage space 102s) or objects within the vehicle storage space, such as boxes or packages (e.g., packages 208p1-208p3 or 210).
The IR light or beams can be distributed over the object or surface in a pattern of dots or points by the IR projector, which can be registered or scanned by the IR camera. A depth detection app, such as a depth detection app running on the one or more processors or memories of TMU 202, can determine various depth values based on the pattern of dots or points, for example, depth values of vehicle storage space 102s. For example, a near-depth object (e.g., nearby boxes, packages, etc.) can be identified where the dots or points are dense, and distant-depth objects (e.g., faraway boxes, packages, etc.) can be identified where the points are more spread out. The various depth values can be used by the depth detection app and/or TMU 202 to generate a depth map. The depth map may display a 3D image of, or contain 3D image data of, the objects or surfaces recorded or scanned by the 3D depth camera 254, for example, the vehicle storage space 102s and any objects or surfaces therein.

The TMU 202 may further include a photo-realistic camera 256 for capturing, recording, or scanning 2D image data. The photo-realistic camera 256 can be an RGB (red, green, blue) based camera for capturing 2D images having RGB-based pixel data. In some embodiments, the photo-realistic camera 256 can capture 2D images, and related 2D image data, at the same or similar point in time as the 3D depth camera 254, such that the TMU 202 has both sets of 3D image data and 2D image data available for a particular surface, object, or scene at the same or similar instant in time.

FIG. 3 is a block diagram representative of an embodiment of a server associated with the loading facility 101 of FIG. 2A. In some embodiments, server 301 may be located in the same facility as loading facility 101. In other embodiments, server 301 may be located at a remote location, such as on a cloud platform or other remote location. In either embodiment, server 301 may be communicatively coupled to a 3D depth camera (e.g., of TMU 202).

Server 301 is configured to execute computer instructions to perform operations associated with the systems and methods described herein, for example implementing the example operations represented by the block diagrams or flowcharts of the drawings accompanying this description. The server 301 may implement enterprise service software that may include, for example, RESTful (representational state transfer) API services, message queuing services, and event services that may be provided by various platforms or specifications, such as the J2EE specification implemented by the Oracle WebLogic Server platform, the JBoss platform, or the IBM WebSphere platform, etc. Other technologies or platforms, such as Ruby on Rails, Microsoft .NET, or similar, may also be used. As described below, the server 301 may be specifically configured to perform operations represented by the block diagrams or flowcharts of the drawings described herein.

The example server 301 of FIG. 3 includes a processor 302, such as, for example, one or more microprocessors, controllers, and/or any suitable type of processor. The example server 301 of FIG. 3 further includes memory (e.g., volatile memory or non-volatile memory) 304 accessible by the processor 302, for example, via a memory controller (not shown). The example processor 302 interacts with the memory 304 to obtain, for example, machine-readable instructions stored in the memory 304 corresponding to, for example, the operations represented by the flowcharts of this description.
Additionally or alternatively, machine-readable instructions corresponding to the example operations of the block diagrams or flowcharts may be stored on one or more removable media (e.g., a compact disc, a digital versatile disc, removable flash memory, etc.), or over a remote connection, such as the Internet or a cloud-based connection, that may be coupled to the server 301 to provide access to the machine-readable instructions stored thereon.

The example server 301 of FIG. 3 may further include a network interface 306 to enable communication with other machines via, for example, one or more computer networks, such as a local area network (LAN) or a wide area network (WAN), e.g., the Internet. The example network interface 306 may include any suitable type of communication interface(s) (e.g., wired and/or wireless interfaces) configured to operate in accordance with any suitable protocol, e.g., Ethernet for wired connections and/or IEEE 802.11 for wireless connections.

The example server 301 of FIG. 3 includes input/output (I/O) interfaces 308 to enable receipt of user input and communication of output data to the user, which may include any number of keyboards, mice, USB ports, optical drives, screens, touchscreens, etc.

FIG. 4A is a photo-realistic view 402 showing a first embodiment of the vehicle storage space 102s coupled to the loading bay 102d of FIG. 2A, wherein the storage space includes an in-range package wall (e.g., package wall 408p1 or package wall 408p2 as described herein). Photo-realistic view 402 may be a 2D image as captured, for example, by photo-realistic camera 256 of TMU 202. Photo-realistic view 402 may include 2D image data, such as pixel data or RGB data, as described herein.

Photo-realistic view 402 shows package walls 408p1 and 408p2. As used herein, a package wall can be a stack of packages, boxes, or other transportable objects or goods typically involved in distribution and logistics. A package wall can also be a single package that forms the base of a new package wall. Each of the packages, boxes, or other transportable objects or goods that make up a particular package wall may share a common depth, length, or other dimension such that the particular package wall, as a whole, has an at least approximately uniform depth, length, or size. As shown in photo-realistic view 402, package walls 408p1 and 408p2 are located at the rear of the vehicle storage space 102s and are within range of the 3D depth camera (e.g., 3D depth camera 254) as described herein. In some embodiments, package walls 408p1 and 408p2 may correspond to one of the packages or package walls 208p1-208p3 of FIG. 2A as described herein.

As further shown in photo-realistic view 402, vehicle storage space 102s is defined by surface areas comprising left wall 404l, right wall 404r, and floor area 404f. The surface areas, including left wall 404l, right wall 404r, and floor area 404f, are generally solid or semi-solid surface areas that together form the interior of vehicle storage space 102s. Photo-realistic view 402 additionally shows loading conveyor 406. Loading conveyor 406 can be used to load or unload packages from vehicle storage space 102s to loading facility 101 through loading bay 102d, and vice versa. For example, packages 410 and 411 can be loaded or unloaded from vehicle storage space 102s to loading facility 101 through loading bay 102d.
Photo-realistic view 402 also shows additional items that may be present in the vehicle storage space, as well as a worker 412 who can facilitate the loading or unloading of the packages 410, including packages from either of the package walls 408p1 and 408p2. In some embodiments, worker 412 may correspond to worker 212 of FIG. 2A as described herein.

FIG. 4B is a histogram 452 of depth data corresponding to the photo-realistic view of FIG. 4A, which includes a detected package wall and other objects located in the field of view of the 3D depth camera (e.g., 3D depth camera 254). The histogram 452 represents a 3D image, and 3D image data, of the vehicle storage space 102s coupled to the loading bay 102d of FIG. 2A, where the storage space comprises a detected package wall (e.g., package wall 408p1 and/or 408p2). In essence, the depth data is analyzed via heuristics to detect dominant peaks, with wall locations determined based on the peaks shown in histogram 452. Trailer-fullness calculations are determined by averaging histogram 452 from a current wall (i.e., the wall closest to the opening of the vehicle storage space 102s to which packages are being added) to the end of the trailer. The systems and methods described herein use the histogram 452 of the raw data (as opposed to clustered data points). The weight of each 3D point depends on its distance from the 3D depth camera (e.g., 3D depth camera 254), as will be described herein.

The histogram 452 includes clusters corresponding to objects located at various depths within the trailer. For example, histogram 452 includes package wall peaks 458p1 and 458p2 that correspond to package walls 408p1 and 408p2. Histogram 452 also shows a worker peak 460, corresponding to worker 412 of the photo-realistic view 402, and staged package peaks 462p1 and 462p2, associated with packages 410 and 411, respectively.

With regard to FIGS. 4B and 4C, in some cases, portions of the storage space 102s positioned close to the 3D depth camera 254 may occupy a larger area in the field of view of the 3D depth camera 254 than their actual size warrants, and thus may create disproportionately high peaks in the histogram 452. For example, peaks 462p1 and 462p2 appear relatively large in histogram 452. By weighting or scaling each point in histogram 452 with a value proportional to its distance from 3D depth camera 254, peaks can be appropriately scaled to account for these distance variations. Accordingly, as shown in FIG. 4C, which shows a scaled histogram 452' of depth data associated with the photo-realistic view of FIG. 4A, peaks 462p1 and 462p2 have a smaller maximum value when compared to the other peaks (e.g., 458p1).
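The scaling of FIG. 4C can be sketched as a weighted histogram in which each 3D point contributes a weight proportional to its distance from the camera, damping the peaks of near-camera objects. This is a hedged illustration only: the patent does not fix a particular weighting function, and the linear weight, bin width, and camera-at-origin convention are illustrative assumptions.

```python
import numpy as np

def scaled_depth_histogram(points, bin_width=0.1):
    """Depth histogram in which each 3D point is weighted by its Euclidean
    distance from the depth camera (assumed to sit at the origin), so that
    nearby objects such as staged packages no longer dominate (cf. FIG. 4C)."""
    depths = points[:, 2]
    weights = np.linalg.norm(points, axis=1)  # per-point distance to camera
    bins = np.arange(0.0, depths.max() + bin_width, bin_width)
    return np.histogram(depths, bins=bins, weights=weights)
```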
With regard to FIGS. 5A-5G, in some cases, portions of the storage space 102s may be temporarily occupied. As shown, for example, in FIGS. 5A-5C, which illustrate historical photo-realistic views of the storage space 102s recorded over a period of time, the storage space 102s may be occupied by the loader 412 and any number of packages 410 that move through the storage space 102s. As illustrated in FIGS. 5D-5F, which represent histogram views of the views illustrated in FIGS. 5A-5C, respectively, the size of peaks 458p1 and 458p2, associated with package walls 408p1 and 408p2, remains relatively stationary over the time period shown in FIGS. 5A-5C. However, the size and location of peaks 460 and 462p1, associated with the loader 412 and any number of packages 410, vary over time as the loader moves and new package walls are formed.

A number of approaches can be used to account for these types of non-wall moving objects, as sketched below. In a first approach, data in consecutive frames can be subtracted to detect the associated moving objects and remove them, so that a histogram of only the stationary objects (i.e., package walls) is calculated. In another approach, data can be accumulated into historical histograms, amplifying peaks associated with stationary objects (i.e., package walls) relative to peaks associated with moving objects. In any of these or other approaches, data recorded in a number of consecutive frames (e.g., 2-5 frames recorded approximately every 15 seconds, every 30 seconds, every minute, etc.) may be used. In the example of FIG. 5G, which is a modified histogram representation of the histograms illustrated in FIGS. 5D-5F, the package wall is easily identifiable in that its associated peak is substantially larger and farther away than the other recorded peaks. However, in some examples, maximum peak heights may be more closely grouped, and additional calculations may thus be required to accurately identify a dominant package wall.
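Both approaches can be rendered as a short sketch. The snippet below is a hypothetical Python illustration that assumes per-frame histograms computed over identical bins; the change threshold used to flag moving objects is not specified in the patent and is chosen here purely for illustration.

```python
import numpy as np

def historical_histogram(frame_histograms):
    """Accumulate histograms from consecutive frames (e.g., 2-5 frames
    recorded every 15-30 seconds): stationary package-wall peaks reinforce
    one another, while a moving loader smears across many bins."""
    return np.sum(np.asarray(frame_histograms), axis=0)

def stationary_bins(frame_histograms, change_threshold=0.5):
    """Frame-differencing variant: flag bins whose counts stay roughly
    constant from frame to frame; large relative changes indicate moving,
    non-wall objects whose bins can be masked out."""
    h = np.asarray(frame_histograms, dtype=float)
    deltas = np.abs(np.diff(h, axis=0))   # per-bin change between frames
    mean = h.mean(axis=0) + 1e-9          # guard against empty bins
    return (deltas / mean).max(axis=0) <= change_threshold
```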
FIG. 6 is a flowchart of a depth imaging method 600 for use in loading a commercial trailer. Method 600 begins (602) at block 604, where a 3D depth camera (e.g., 3D depth camera 254 of TMU 202) records 3D image data of a vehicle storage space (e.g., vehicle storage space 102s). The 3D depth camera is oriented in a direction to record the 3D image data of the vehicle storage space as described herein. In some embodiments, the 3D image data can be recorded periodically, such as, for example, approximately every 15 seconds, approximately every 30 seconds, approximately every minute, and/or approximately every two minutes, etc., but it can be recorded at any time frequency provided by the related 3D depth camera, for example as provided by TMU 202.

At block 606, the 3D image data is analyzed by a depth detection application running on one or more processors. In some embodiments, the one or more processors may be processor(s) of the TMU 202, as described herein. In certain embodiments, the 3D depth camera and the one or more processors may be housed in an attachable device, such as TMU 202 shown in FIGS. 2A and 2B. In other embodiments, the one or more processors may be processor(s) (e.g., processor 302) of the server 301 as described herein. Based on the 3D image data, the depth detection application determines a wall data area and a non-wall data area. At block 608, the depth detection application generates a wall indicator that indicates a wall located at a particular depth within the vehicle storage space, and can further calculate a wall fill percentage.

In one approach, and as illustrated in FIGS. 7A-7C, wall data areas can be determined by first calculating a histogram (e.g., histogram 452) of all the depths located in the 3D image. Upon identifying a number of histogram peaks (e.g., 458p1, 458p2, 460, etc. of FIGS. 7A-7C), a highest peak is identified, which in the illustrated example corresponds to peak 458p1. Then all peaks that are less than a specified threshold (for example, all peaks that are less than 1/6th of the maximum peak height) are filtered out and no longer considered. The remaining peaks are possible candidates for the current wall. A farthest peak (e.g., peak 458p1) is then identified, and peaks positioned at more than a specified distance from it (e.g., 50" or the vertical line MD shown in FIGS. 7B and 7C) are also filtered out and no longer considered. Such filtered peaks can represent the loader and/or packages placed on the conveyor for staging purposes. The current wall can then be identified by locating a maximum of the remaining peaks (e.g., peak 458p2).

In some examples, slightly smaller peaks (e.g., peak 458p1) may be located behind the current wall and may remain within the specified threshold. In some examples, the relative proximity between peaks 458p1 and 458p2, combined with their comparable height values, may result in an incorrect identification of the current wall. Such peaks can represent package walls that are not yet fully formed, meaning that additional packages may still be placed on top of the existing packages. To prevent the systems and methods described herein from alternating between identifying peaks 458p1 and 458p2 as the current wall, which would adversely affect the fill calculations, heuristics can be established to properly identify the correct wall as the current wall. Considering the distance between the farthest peak and the nearer peak: if the farthest peak is positioned approximately within 50" of the nearer peak and has a height between about 50% and 99% of the height of the nearer peak, the farther peak is considered the dominant peak and, accordingly, the current wall. In some examples, the appropriate value of the difference between peak heights is chosen as a function of the inter-peak distance. Other examples of suitable distances between walls and differences between wall heights are possible.

In some embodiments, an average wall determination is made. In some examples, the relative proximity between peaks 458p1 and 458p2, combined with their similar height values, may result in incorrect identification of the current wall. Furthermore, the systems and methods described herein may "flip-flop" between identifying peaks 458p1 and 458p2 as the current wall, which may adversely affect the fill calculations. With regard to FIG. 7C, an "average wall" determination is made by considering the entire histogram located behind the nearest peak candidate (458p2), and a center of gravity is defined by averaging the histogram behind this nearest peak candidate. For example, if the fullness calculation determines that peak 458p1 is about 45% full and peak 458p2 is about 55% full, an equivalent calculation would correspond to saying that at an average distance (for example, at arrow AVG) the wall is completely full. As a result, fullness determinations can be made faster and more accurately. Arranged in this way, the described systems and methods can significantly reduce calculation times. For example, fill calculations can be completed in about 20-25 ms. Additionally, the described systems and methods can provide a more consistent and stable fill calculation.
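As a final hedged sketch, the heuristics just described, namely the roughly 50-inch proximity window, the 50%-99% relative-height rule, and the center-of-gravity "average wall", might be rendered as follows. The helper names are hypothetical, the inputs are assumed to come from the peak-filtering sketches above, and the thresholds are simply the example values quoted in the text; a fullness percentage then follows from the average wall depth relative to the total trailer depth.

```python
import numpy as np

def pick_current_wall(peaks, counts, edges, window=1.27):
    """Among the remaining candidate peaks, prefer the farther of two peaks
    as the current wall when it lies within ~50 inches (1.27 m) of the
    nearer peak and is 50%-99% of its height; otherwise take the tallest."""
    nearer = min(peaks, key=lambda i: edges[i])
    farther = max(peaks, key=lambda i: edges[i])
    close_enough = edges[farther] - edges[nearer] <= window
    comparable = (0.50 * counts[nearer] <= counts[farther]
                  <= 0.99 * counts[nearer])
    if close_enough and comparable:
        return farther
    return max(peaks, key=lambda i: counts[i])

def average_wall_depth(counts, edges, nearest_peak):
    """Center of gravity of the histogram at and behind the nearest peak
    candidate, yielding a single stable "average wall" depth (arrow AVG in
    FIG. 7C) for the fullness calculation."""
    centers = (edges[:-1] + edges[1:]) / 2.0
    return float(np.average(centers[nearest_peak:],
                            weights=counts[nearest_peak:]))
```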
In some embodiments, a dashboard application running on a client device (e.g., client device 204 of FIG. 2A) can receive the wall indicator. In certain embodiments, the wall indicator can be received by the dashboard application from the TMU device (e.g., the TMU 202). In other embodiments, the wall indicator can be received by the dashboard application from a server (e.g., server 301). In additional embodiments, the dashboard application may include a vehicle storage space capacity value. As described herein, the vehicle storage space capacity value may indicate the filled capacity, or fullness, of a vehicle storage space, such as vehicle storage space 102s. In other embodiments, the vehicle storage space capacity value may indicate the remaining capacity, or emptiness, of a vehicle storage space. In still further embodiments, the dashboard application may include or display a vehicle storage space capacity graph. The vehicle storage space capacity graph can graphically indicate one or more vehicle storage space capacity value(s) at one or more time value(s).

Specific embodiments have been described in the foregoing specification. Various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present teachings. The benefits, solutions to problems, and any element(s) that may cause any benefit or solution to occur or become more pronounced are not to be construed as critical, required, or essential measures or elements of any or all of the claims. The invention is defined solely by the appended claims, including any amendments made during the grant phase of this application and all equivalents of those claims as issued.

For the sake of clarity and brevity of description, features are described herein as part of the same or separate embodiments. It is noted, however, that the scope of the invention may include embodiments comprising combinations of all or some of the features described. It is further noted that the embodiments shown have the same or similar components, except where described as different.

In addition, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," "has," "having," "includes," "including," "contains," "containing," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, or contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such a process, method, article, or apparatus. An element preceded by "comprises ... a," "has ... a," "includes ... a," or "contains ... a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element. The term "a" or "an" is defined as one or more, unless expressly stated otherwise herein. The terms "substantially," "essentially," "approximately," "about," or any other version thereof are defined as being close to, as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%. The term "coupled" as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is "configured" in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
Some embodiments may comprise one or more generic or specialized processors (or "processing devices"), such as microprocessors, digital signal processors, customized processors, and field programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in combination with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein. Alternatively, some or all of the functions could be implemented by a state machine that has no stored program instructions, or in one or more application-specific integrated circuits (ASICs), in which each function, or some combinations of certain of the functions, is implemented as custom logic. Of course, a combination of the two approaches could be used.

In addition, an embodiment can be implemented as a computer-readable storage medium having stored thereon computer-readable code for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage media include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (read-only memory), a PROM (programmable read-only memory), an EPROM (erasable programmable read-only memory), an EEPROM (electrically erasable programmable read-only memory), and a flash memory. It is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles described herein, will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.

The abstract is provided to enable the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope of the claims. In addition, in the foregoing description, various features may be grouped together in various embodiments for the purpose of streamlining the description. This manner of description is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, the inventive subject matter lies in less than all features of a single described embodiment. Thus, the following claims are hereby incorporated into the description, with each claim standing on its own as separately claimed subject matter. The mere fact that certain measures are recited in mutually different claims does not indicate that a combination of these measures cannot be used to advantage. Many variants will be apparent to the skilled person. All variants are considered to be included within the scope of the invention as defined in the following claims.
CLAIMS

[1] A three-dimensional (3D) depth imaging system for use in loading a commercial trailer, the 3D depth imaging system comprising: a 3D depth camera adapted to record 3D image data, the 3D depth camera oriented in a direction to record 3D image data of a vehicle loading space; and a depth detection application executed on one or more processors, wherein the depth detection application determines, based on the 3D image data, at least a wall data area and a non-wall data area, wherein the determination of the wall data area and the non-wall data area prompts the depth detection application to generate a wall indicator, the wall indicator indicating that a wall is located at a particular depth within the vehicle loading space.

[2] The 3D depth imaging system of claim 1, wherein the 3D depth camera and the one or more processors are housed in an attachable device.

[3] The 3D depth imaging system of claim 1 or 2, wherein the wall can be one of: a package wall or a wall of the vehicle loading space.

[4] The 3D depth imaging system of any of the preceding claims, wherein the non-wall data area is associated with one of: a loader or a vehicle loading space staging area.

[5] The 3D depth imaging system of any of the preceding claims, wherein the non-wall data area is determined by comparing historical 3D image data to identify at least one peak area.

[6] The 3D depth imaging system of any of the preceding claims, wherein the depth detection application is configured to ignore non-wall data areas when generating the wall indicator.

[7] The 3D depth imaging system of any of the preceding claims, further comprising a dashboard application, wherein the dashboard application is executed on a client device, and wherein the determination of the wall data area and the non-wall data area further prompts the dashboard application to receive the wall indicator.

[8] The 3D depth imaging system of claim 7, wherein the dashboard application comprises a vehicle loading space capacity value, the vehicle loading space capacity value indicating a filled capacity of the vehicle loading space.

[9] The 3D depth imaging system of claim 7 or 8, wherein the dashboard application comprises a vehicle loading space capacity value, the vehicle loading space capacity value indicating a remaining capacity of the vehicle loading space.

[10] The 3D depth imaging system of any of the preceding claims, wherein the 3D image data is 3D point cloud data.

[11] The 3D depth imaging system of any of the preceding claims, wherein the 3D image data is recorded periodically.

[12] The 3D depth imaging system of claim 11, wherein the 3D image data is recorded at one of: approximately every 15 seconds, approximately every 30 seconds, approximately every minute, or approximately every two minutes.
[13] A three-dimensional (3D) depth imaging method for use in loading a commercial trailer, the 3D depth imaging method comprising: recording, via a 3D depth camera, 3D image data of a vehicle loading space, wherein the 3D depth camera is oriented in a direction to record the 3D image data of the vehicle loading space; analyzing, via one or more processors, the 3D image data with a depth detection application, wherein the depth detection application determines, based on the 3D image data, at least a wall data area and a non-wall data area; and generating, via the depth detection application, a wall indicator, the wall indicator indicating that a wall has been detected at a particular depth within the vehicle loading space.

[14] The 3D depth imaging method of claim 13, wherein analyzing the 3D image data comprises: calculating a histogram of the depths of the 3D image data; identifying a plurality of histogram peaks; identifying a highest peak from the plurality of histogram peaks and filtering out the histogram peaks having a height that is outside a threshold of the highest peak; and identifying a peak from the plurality of histogram peaks that is farthest from the 3D depth camera and filtering out the remaining histogram peaks that are located more than a specified distance from the peak that is farthest from the 3D depth camera.

[15] The 3D depth imaging method of claim 14, further comprising determining whether the peak that is farthest from the 3D depth camera is the current wall.

[16] The 3D depth imaging method of claim 14 or 15, wherein analyzing the 3D image data further comprises identifying a highest remaining peak from the remaining plurality of histogram peaks and generating a wall indicator.

[17] The 3D depth imaging method of any of the preceding claims 14-16, further comprising calculating an average wall placed between the wall indicator and a total depth of the vehicle loading space.

[18] The 3D depth imaging method of any of the preceding claims 13-17, wherein the non-wall data area is associated with one of: a loader or a vehicle loading space staging area.

[19] The 3D depth imaging method of any of the preceding claims 13-18, wherein the non-wall data area is determined by comparing historical 3D image data to identify at least one peak area.

[20] The 3D depth imaging method of any of the preceding claims 13-19, wherein the depth detection application is configured to ignore non-wall data areas when generating the wall indicator.

[21] The 3D depth imaging method of any of the preceding claims 13-20, further comprising receiving the wall indicator via a dashboard application executed on a client device.

[22] The 3D depth imaging method of claim 21, wherein the dashboard application comprises a vehicle loading space capacity value, the vehicle loading space capacity value indicating a filled capacity of the vehicle loading space.

[23] The 3D depth imaging method of any of the preceding claims 13-22, wherein the 3D image data is recorded at one of: approximately every 15 seconds, approximately every 30 seconds, approximately every minute, or approximately every two minutes.